Loss Function
Perceptron
$$ \begin{aligned} \ell(y, p) &= -p \cdot y, \qquad p = h(\underline{w} \cdot \underline{x}) \end{aligned} $$

Cross-Entropy
$$ \begin{aligned} \ell(\underline{y}, \underline{p}) &= \sum_{c=1}^{C} 1_{[y_c = 1]} \cdot \log \frac{1}{p_c} \\ &= \underline{y} \cdot \log \frac{1}{\underline{p}} \end{aligned} $$

Expanded for the special cases:
- Logistic Regression ($C = 2$, $y \in \{0, 1\}$, scalar $p = p_1$): $\ell(y, p) = y \log \frac{1}{p} + (1 - y) \log \frac{1}{1 - p}$
- Softmax Regression: $\ell(\underline{y}, \underline{p}) = \sum_{c=1}^{C} y_c \log \frac{1}{p_c}$, which for a one-hot $\underline{y}$ reduces to $\log \frac{1}{p_c}$ for the true class $c$.
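A minimal NumPy sketch of the losses above (the function names are my own, and $h$ in the perceptron loss is assumed to be the identity, so $p$ is the raw score):

```python
import numpy as np

def perceptron_loss(y, w, x):
    """Perceptron loss l(y, p) = -p * y with p = h(w . x).

    h is assumed to be the identity here, so p is the raw score;
    the loss is positive exactly when the example is misclassified.
    """
    p = np.dot(w, x)
    return -p * y

def cross_entropy(y, p):
    """Multiclass cross-entropy l(y, p) = y . log(1/p) for one-hot y."""
    return float(np.dot(y, np.log(1.0 / p)))

def logistic_loss(y, p):
    """Binary special case (logistic regression): y in {0, 1}, scalar p = P(y = 1)."""
    return y * np.log(1.0 / p) + (1 - y) * np.log(1.0 / (1.0 - p))

# Misclassified point (y = +1 but score w . x = -2) gives positive loss 2.0.
print(perceptron_loss(1, np.array([1.0, -2.0]), np.array([0.0, 1.0])))
# One-hot target on the class with p_c = 0.5 gives log 2.
print(cross_entropy(np.array([0.0, 1.0, 0.0]), np.array([0.2, 0.5, 0.3])))
```

Note that `logistic_loss(y, p)` and `cross_entropy([1 - y, y], [1 - p, p])` agree, illustrating that logistic regression is the $C = 2$ case of the general formula.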